23 research outputs found

    Alternative Methods of Seasonal Adjustment

    Get PDF
    Alternative methods for the seasonal adjustment of economic data are described that operate in the time domain and in the frequency domain. The time-domain method, which employs a classical comb filter, mimics the effects of the model-based procedures of the SEATS–TRAMO and STAMP programs. The frequency-domain method eliminates the sinusoidal elements of which, in the judgment of the user, the seasonal component is composed. It is proposed that, in some circumstances, seasonal adjustment is best achieved by eliminating all elements in excess of the frequency that marks the upper limit of the trend-cycle component of the data. It is argued that the choice of the method of seasonal adjustment is liable to affect the determination of the turning points of the business cycle. Keywords: Wiener–Kolmogorov Filtering; Frequency-Domain Methods; The Trend-Cycle Component
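    As a rough illustration of the frequency-domain idea summarised above, the sketch below removes every Fourier ordinate above a chosen trend-cycle cutoff and transforms back to the time domain; the cutoff of one cycle per 18 months and the synthetic monthly series are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch of frequency-domain seasonal adjustment: discard all
# Fourier ordinates above a trend-cycle cutoff, then invert the transform.
import numpy as np

def lowpass_adjust(y, cutoff_cycles_per_obs):
    """Zero all Fourier ordinates whose frequency (cycles per observation)
    exceeds the cutoff, and return the filtered series."""
    n = len(y)
    freqs = np.fft.rfftfreq(n, d=1.0)      # frequencies in cycles/observation
    coeffs = np.fft.rfft(y)
    coeffs[freqs > cutoff_cycles_per_obs] = 0.0
    return np.fft.irfft(coeffs, n)

# Illustrative monthly series: slow trend + annual seasonal + noise.
rng = np.random.default_rng(0)
t = np.arange(240)
y = 0.02 * t + np.sin(2 * np.pi * t / 12) + 0.2 * rng.standard_normal(240)

# Keep only components slower than one cycle per 18 months.
trend_cycle = lowpass_adjust(y, cutoff_cycles_per_obs=1 / 18)
```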

    Real-time prediction with U.K. monetary aggregates in the presence of model uncertainty

    Get PDF
    A popular account of the demise of the U.K.'s monetary targeting regime in the 1980s blames the fluctuating predictive relationships between broad money and both inflation and real output growth. Yet ex post policy analysis based on heavily revised data suggests no fluctuations in the predictive content of money. In this paper, we investigate the predictive relationships for inflation and output growth using both real-time and heavily revised data. We consider a large set of recursively estimated vector autoregressive (VAR) and vector error correction (VECM) models. These models differ in terms of lag length and the number of cointegrating relationships. We use Bayesian model averaging (BMA) to demonstrate that real-time monetary policymakers faced considerable model uncertainty. The in-sample predictive content of money fluctuated during the 1980s as a result of data revisions in the presence of model uncertainty. This feature is only apparent with real-time data, as heavily revised data obscure these fluctuations. Out-of-sample predictive evaluations rarely suggest that money matters for either inflation or real output. We conclude that both data revisions and model uncertainty contributed to the demise of the U.K.'s monetary targeting regime.
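    The sketch below illustrates, under stated assumptions, how Bayesian model averaging weights can be formed from log marginal likelihoods and used to average point forecasts; the three models, their likelihood values and their forecasts are hypothetical placeholders, not the paper's VAR/VECM estimates.

```python
# Minimal sketch of Bayesian model averaging over competing forecasting models.
import numpy as np

def bma_weights(log_marginal_likelihoods, prior=None):
    """Posterior model probabilities from log marginal likelihoods."""
    lml = np.asarray(log_marginal_likelihoods, dtype=float)
    if prior is None:
        prior = np.full(lml.shape, 1.0 / lml.size)   # equal prior model weights
    log_post = np.log(prior) + lml
    log_post -= log_post.max()                       # guard against underflow
    w = np.exp(log_post)
    return w / w.sum()

# Three hypothetical models: log marginal likelihoods and inflation forecasts.
weights = bma_weights([-102.3, -100.1, -104.8])
forecasts = np.array([2.1, 2.4, 1.8])
bma_forecast = weights @ forecasts        # model-averaged point forecast
```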

    Time series decomposition and business cycle analysis

    No full text
    Available from British Library Document Supply Centre - DSC:DXN054514 / BLDSC - British Library Document Supply Centre. SIGLE. GB. United Kingdom.

    Forecasting exchange rates using panel model and model averaging

    No full text
    We propose to produce accurate point and interval forecasts of exchange rates by combining a number of well-known fundamentals-based panel models. The models are combined using a set of weights computed within a linear mixture-of-experts framework, where the weights are determined by the log scores assigned to each model's predictive performance. As well as model uncertainty, we take potential structural breaks in the parameters of the models into consideration. In our application to quarterly data for ten currencies (including the Euro) over the period 1990q1–2008q4, we show that the ensemble models produce mean and interval forecasts that outperform equal-weight and, to a lesser extent, random-walk benchmark models. The gain from combining forecasts is particularly pronounced for longer-horizon central forecasts, but much less so for interval forecasts. Calculations of the probability of the exchange rate rising or falling using the combined or ensemble model show a good correspondence with known events and potentially provide a useful measure of uncertainty about whether the exchange rate is likely to rise or fall.
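    As a hedged sketch of the weighting scheme described above, the code below converts cumulative log predictive scores into combination weights and applies them to point forecasts; the four models, their scores and their forecasts are invented for illustration and do not reproduce the paper's panel specifications.

```python
# Minimal sketch of forecast combination with log-score-based weights.
import numpy as np

def log_score_weights(cumulative_log_scores):
    """Turn each model's cumulative log predictive score into a weight."""
    s = np.asarray(cumulative_log_scores, dtype=float)
    s -= s.max()                          # numerical stability
    w = np.exp(s)
    return w / w.sum()

# Hypothetical cumulative log scores for four exchange-rate models.
weights = log_score_weights([-55.2, -53.9, -57.1, -54.4])

# Combine each model's point forecast of the exchange-rate change.
point_forecasts = np.array([0.8, 1.1, -0.2, 0.5])
combined = weights @ point_forecasts
```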

    Three essays in macroeconomics and monetary economics using Bayesian multivariate smooth transition approaches

    No full text
    The first essay introduces a Bayesian logistic smooth transition vector autoregression (LSTVAR) approach to investigating the impact of international business cycles on the UK economy. We find that the British business cycle is asymmetrically influenced by growth in the US, France and Germany. Overall, positive and negative shocks originating in the US or France affect the UK in the same direction as the shock. However, a shock emanating from Germany always exerts negative cumulative effects on the UK. Further, a positive shock arising from Germany adversely affects UK output growth more than a negative shock of the same size. The second essay proposes a Bayesian method for investigating purchasing power parity (PPP) using an exponential smooth transition vector error correction model (ESTVECM). Employing a simple Gibbs sampler, we jointly estimate the cointegrating relationship along with the nonlinearities caused by departures from the long-run equilibrium. By allowing for symmetric regime changes, we provide strong evidence that PPP holds between the US and each of the remaining G7 countries. The model we employ implies that the dynamics of the PPP deviations can be rather complex, which is attested to by the impulse response analysis. The final essay proposes a Bayesian approach to exploring money-output causality within a logistic smooth transition vector error correction (LSTVECM) framework. Our empirical results provide substantial evidence that the postwar US money-output relationship is nonlinear, with regime changes mainly governed by lagged inflation rates. More importantly, we obtain strong support for long-run non-causality and nonlinear Granger causality from money to output. Furthermore, our impulse response analysis reveals that a shock to money appears to have a negative aggregate impact on real output over the next fifty years, which calls for more caution when using money as a policy instrument. EThOS - Electronic Theses Online Service. GB. United Kingdom.
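    To make the smooth transition mechanism concrete, the sketch below shows the logistic transition function and a two-regime, single-variable one-step prediction that mixes two autoregressive coefficients; the smoothness parameter, threshold and coefficients are illustrative assumptions rather than the essays' posterior estimates.

```python
# Minimal sketch of logistic smooth transition dynamics (single variable).
import numpy as np

def logistic_transition(s, gamma, c):
    """G(s; gamma, c) in (0, 1): the weight placed on the second regime."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def one_step(y_lag, s, phi_regime1, phi_regime2, gamma=2.0, c=0.0):
    """One-step prediction mixing two AR(1) coefficients via the transition."""
    g = logistic_transition(s, gamma, c)
    return (1.0 - g) * phi_regime1 * y_lag + g * phi_regime2 * y_lag

# Example: lagged inflation (s) as the transition variable.
print(one_step(y_lag=1.5, s=3.2, phi_regime1=0.9, phi_regime2=0.4))
```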

    Finite sample distributions and non-normality in second generation panel unit root tests

    No full text
    A notable advantage of panel unit root test statistics is that they are asymptotically Gaussian, rather than following the complicated functionals of Wiener processes that arise in traditional single time series tests. As a result, the asymptotic critical values are used directly and the finite sample performance of the tests is not given proper attention. In addition, the unit root test literature relies heavily on the normality assumption; when this condition fails, the asymptotic results are no longer valid. This thesis analyzes and finds serious finite sample bias in panel unit root tests, together with a systematic impact of non-normality on the tests. Using Monte Carlo simulations, and in particular response surface analysis with newly designed functional forms of the response surface regressions, the thesis demonstrates that the patterns of finite sample bias and test bias vary closely with the sample size and with the degree of non-normality, respectively. Finite sample critical values are then proposed; more importantly, they are augmented by the David–Johnson estimate of the percentile standard deviation to account for the randomness incurred by stochastic simulations. Non-normality is modeled by the Lévy-Paretian stable distribution. A certain degree of non-normality is found that causes such severe test distortion that the finite sample critical values computed under normality are no longer valid. This provides important indications of the reliability of panel unit root test results when empirical data exhibit non-normality. Finally, a panel of OECD country inflation rates is examined for stationarity, taking account of its structural breaks. Instead of constructing structural breaks into the panel unit root tests, an alternative and new approach is proposed that treats the breaks as a type of non-normality. With the help of the earlier results in the thesis, the study supports the presence of a unit root in inflation rates. EThOS - Electronic Theses Online Service. GB. United Kingdom.
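    The sketch below indicates, under stated assumptions, how finite sample critical values can be simulated for a Dickey-Fuller-type t-statistic with heavy-tailed stable innovations; the stability parameter, sample size, replication count and the simple no-constant test regression are illustrative and do not reproduce the thesis's response-surface machinery.

```python
# Minimal sketch: Monte Carlo finite-sample critical value for a
# Dickey-Fuller-type t-statistic under stable (heavy-tailed) innovations.
import numpy as np
from scipy.stats import levy_stable

def df_tstat(y):
    """t-statistic on rho in: Delta y_t = rho * y_{t-1} + e_t (no constant)."""
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 1)
    return rho / np.sqrt(s2 / (ylag @ ylag))

def critical_value(alpha_stable=1.7, T=100, reps=2000, level=0.05, seed=0):
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(reps):
        e = levy_stable.rvs(alpha_stable, 0.0, size=T, random_state=rng)
        y = np.cumsum(e)                  # random walk under the null
        stats.append(df_tstat(y))
    return np.quantile(stats, level)      # lower-tail critical value

print(critical_value())
```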

    On suboptimality of the Hodrick–Prescott filter at time series endpoints

    No full text
    The Hodrick–Prescott filter is often applied to economic series as part of the study of business cycles. Its properties have most frequently been explored through the development of essentially asymptotic results, which are practically relevant only some distance from series endpoints. Our concern here is with the most recent observations, as policy-makers will often require an assessment of whether, and by how much, an economic variable is "above trend". We show that if such an issue is important, an easily implemented adjustment to the filter is desirable.
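    As a brief illustration of the endpoint issue raised above, the sketch below writes the Hodrick–Prescott trend as tau = (I + lambda * D'D)^{-1} y and inspects the implicit weights on the data at the final observation versus an interior one; the smoothing parameter lambda = 1600 and the series length are conventional assumptions, and the authors' proposed adjustment is not reproduced here.

```python
# Minimal sketch: HP filter as penalised least squares, to compare the
# implicit data weights at the endpoint with those at an interior point.
import numpy as np

def hp_weight_matrix(n, lam=1600.0):
    """Return W such that the HP trend is tau = W @ y."""
    D = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference operator
    return np.linalg.inv(np.eye(n) + lam * D.T @ D)

W = hp_weight_matrix(80)
print(W[-1, -5:])     # weights on the five most recent observations (endpoint row)
print(W[40, 38:43])   # weights around an interior observation, for comparison
```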